Summary.
Patient safety and errors in medicine are attracting increasing public attention. According to recent studies, medical errors are among the ten most frequent causes of death. A new wave of engagement with errors and their causes at the system level of health care has begun. The specialty of anesthesiology is regarded as a model in the effort to systematically increase patient safety. This is both an honor and a mandate. In other high-risk domains with high demands on system safety (nuclear power, aviation), numerous strategies for increasing safety have proven effective. It seems time to examine these strategies for application in medicine and, where necessary, to adapt and implement them accordingly. These strategies include teaching how errors arise in complex systems and which types of errors exist; introducing reporting systems for adverse events, which must be free of negative consequences for those who report; promoting continuous training and the development of generic problem-solving skills; and, finally, making the greatest possible use of training simulators. The most important factor for a long-term increase in patient safety, however, is a change of culture. The current culture of blaming individuals (“culture of blame”) should give way to an open safety culture (“safety culture”) that views errors and incidents as a problem of the system as a whole. Accepting human fallibility and analyzing errors openly, without personal blame, in the sense of a “preventive error culture”, should then also lead to solutions at the system level. This change of culture can only be accomplished with strong commitment from the highest level, by explicitly declaring patient safety the foremost goal: “Primum nihil nocere” - “The most important thing is: do no harm”.
Patient Safety and Errors in Medicine: Development, Prevention and Analyses of Incidents.
“Patient safety” and “errors in medicine” are issues gaining more and more prominence in the eyes of the public. According to recent studies, medical errors rank among the ten leading causes of death across the entire area of health care. A new era has begun in which a “systems” approach is applied to errors and their causes in the health care system. In other high-risk domains with a high demand
for safety (such as the nuclear power industry and aviation) many strategies to enhance
safety have been established. It is time to study these strategies, to adapt them where necessary, and to apply them to the field of medicine. These strategies include: teaching people how errors evolve in complex working domains and how types of errors are classified; introducing critical incident reporting systems that are free of negative consequences for the reporters; promoting continuous medical education; and developing generic problem-solving skills, incorporating the extensive use of realistic simulators wherever possible. Interestingly, the field of anesthesiology - within which realistic simulators were developed - is referred to as a model for the new patient safety movement. Despite this proud track record, however, there is still much to be done, even in anesthesiology. Overall,
the most important strategy towards a long-term improvement in patient safety will
be a change of “culture” throughout the entire health care system. The “culture of
blame” focused on individuals should be replaced by a “safety culture” that sees errors and critical incidents as a problem of the whole organization. The acceptance of human fallibility and an open-minded, non-punitive analysis of errors in the sense
of a “preventive and proactive safety culture” should lead to solutions at the systemic
level. This change in culture can only be achieved with a strong commitment from the
highest levels of an organization. Patient safety must have the highest priority in
the goals of the institution: “Primum nihil nocere” - “First, do no harm”.
Key words:
Safety Culture - Critical Incidents - CRM (Crisis Resource Management)
Dr. Marcus Rall
Klinik für Anaesthesiologie, Tübinger Patientensicherheits- und Simulationszentrum, Universitätsklinikum Tübingen
Hoppe-Seyler-Straße 3
72076 Tübingen
Email: marcus.rall@med.uni-tuebingen.de